Stochastic Variational Inference for Gaussian Process Latent Variable Models using Back Constraints
Authors
Abstract
Gaussian process latent variable models (GPLVMs) are a probabilistic approach to modelling data that employs a Gaussian process mapping from latent variables to observations. This paper revisits a recently proposed variational inference technique for GPLVMs and methodically analyses the optimality and the different parameterisations of the variational approximation. We investigate a structured variational distribution that maintains information about the dependencies between hidden dimensions, and propose a mini-batch-based stochastic training procedure, enabling a more scalable training algorithm. This is achieved by using variational recognition models (also known as back constraints) to parameterise the variational approximation. We demonstrate the validity of our approach on a set of unsupervised learning tasks for texture images and handwritten digits.
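The back-constraint idea in the abstract can be sketched in a few lines: instead of keeping free variational parameters for every data point, a shared recognition model maps each observation to the parameters of its approximate posterior, so a mini-batch update touches only the shared weights. The sketch below is a minimal, hypothetical illustration (a single linear layer and made-up sizes), not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: N observations in D dimensions, Q latent dimensions (hypothetical sizes).
N, D, Q = 100, 5, 2
Y = rng.normal(size=(N, D))

# Back constraint (recognition model): a single linear layer mapping each
# observation y_n to the mean of its variational posterior q(x_n).
# W and b are shared across data points, so the number of variational
# parameters scales with D*Q rather than N*Q -- this is what makes
# mini-batch stochastic training possible.
W = rng.normal(size=(D, Q)) * 0.1
b = np.zeros(Q)
log_s = np.zeros(Q)  # shared log standard deviation of q(x_n)

def recognition(Y_batch):
    """Return per-point variational means and (shared) standard deviations."""
    mu = Y_batch @ W + b
    s = np.exp(log_s)
    return mu, s

# A mini-batch needs only the shared recognition parameters, not per-point ones.
batch = Y[rng.choice(N, size=10, replace=False)]
mu, s = recognition(batch)
print(mu.shape, s.shape)  # (10, 2) (2,)
```

In a real model the linear map would be replaced by a richer function (e.g. an MLP), but the parameter-sharing structure is the same.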
Related papers
Gaussian Processes for Big Data
We introduce stochastic variational inference for Gaussian process (GP) models. This enables the application of GP models to data sets containing millions of data points. We show how GPs can be variationally decomposed to depend on a set of globally relevant inducing variables which factorize the model in the necessary manner to perform variational inference. Our approach is readi...
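The key mechanism here, that a bound decomposing as a sum over data points admits unbiased mini-batch estimates after rescaling by N/|B|, can be illustrated without any GP machinery. In the sketch below the per-point terms are stand-in random numbers; with inducing variables the GP bound has exactly this sum structure.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in per-point terms of a bound that decomposes as a sum over data points
# (with inducing variables, the variational GP bound has this structure).
N = 1000
per_point = rng.normal(size=N)
full_sum = per_point.sum()

def minibatch_estimate(idx):
    # Rescale the mini-batch sum by N / |B| so it is an unbiased
    # estimate of the full-data sum.
    return (N / len(idx)) * per_point[idx].sum()

# Sanity check: averaging the estimates over a disjoint partition of the data
# recovers the full-data sum exactly.
batches = np.arange(N).reshape(20, 50)
ests = [minibatch_estimate(b) for b in batches]
print(np.isclose(np.mean(ests), full_sum))  # True
```

In stochastic variational inference these mini-batch estimates drive noisy gradient steps on the global (inducing-point) parameters.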
Generic Inference in Latent Gaussian Process Models
We develop an automated variational method for inference in models with Gaussian process (gp) priors and general likelihoods. The method supports multiple outputs and multiple latent functions and does not require detailed knowledge of the conditional likelihood, only needing its evaluation as a black-box function. Using a mixture of Gaussians as the variational distribution, we show that the e...
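The combination this abstract describes, a black-box likelihood paired with a mixture-of-Gaussians variational distribution, reduces to a Monte Carlo expectation that needs only pointwise likelihood evaluations. A toy one-dimensional sketch (the likelihood and mixture parameters are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)

# A hypothetical black-box log-likelihood: we only ever evaluate it pointwise,
# never inspect its form (here it happens to be Gaussian noise around y).
def log_lik(f, y=1.3):
    return -0.5 * (y - f) ** 2

# Mixture-of-Gaussians variational distribution over a scalar latent f.
weights = np.array([0.3, 0.7])
means = np.array([-1.0, 1.0])
stds = np.array([0.5, 0.8])

def expected_log_lik(S=10_000):
    """Monte Carlo estimate of E_q[log p(y | f)] using only log_lik evaluations."""
    comp = rng.choice(len(weights), size=S, p=weights)  # sample mixture components
    f = rng.normal(means[comp], stds[comp])             # sample f from q
    return log_lik(f).mean()

print(expected_log_lik())
```

Because the expectation is estimated by sampling from q, no conditional-likelihood derivations are required; only black-box evaluations, as the abstract states.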
Reliable and Scalable Variational Inference for the Hierarchical Dirichlet Process
We introduce a new variational inference objective for hierarchical Dirichlet process admixture models. Our approach provides novel and scalable algorithms for learning nonparametric topic models of text documents and Gaussian admixture models of image patches. Improving on the point estimates of topic probabilities used in previous work, we define full variational posteriors for all latent var...
Deterministic Annealing for Stochastic Variational Inference
Stochastic variational inference (SVI) maps posterior inference in latent variable models to nonconvex stochastic optimization. While variational inference methods enable approximate posterior inference for many otherwise intractable models, they suffer from local optima. We introduce deterministic annealing for SVI to overcome this issue. We introduce a temperature parameter that deterministic...
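A common form of the temperature parameter mentioned above is to temper the entropy term of the ELBO: at high temperature the entropy dominates and the objective is smoother, and cooling toward T = 1 recovers the standard bound. A schematic sketch (the schedule and numbers below are illustrative, not from the paper):

```python
# Annealed objective: expected log-joint plus a tempered entropy term.
# At T = 1 this reduces to the ordinary evidence lower bound (ELBO).
def annealed_elbo(expected_log_joint, entropy, T):
    return expected_log_joint + T * entropy

# Illustrative cooling schedule, from an entropy-dominated smooth objective
# down to the true bound.
for T in [8.0, 4.0, 2.0, 1.0]:
    print(f"T={T}: {annealed_elbo(-3.0, 1.5, T)}")
```

In practice the annealing is interleaved with the stochastic gradient updates, so early iterations explore broadly before the objective sharpens.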
Black Box Variational Inference for State Space Models
Latent variable time-series models are among the most heavily used tools from machine learning and applied statistics. These models have the advantage of learning latent structure both from noisy observations and from the temporal ordering in the data, where it is assumed that meaningful correlation structure exists across time. A few highly-structured models, such as the linear dynamical syste...